Steepest descent approximations in Banach space
Authors
Abstract
Let E be a real Banach space and let A : E → E be a Lipschitzian generalized strongly accretive operator. Let z ∈ E and let x0 ∈ E be an arbitrary initial value for which the steepest descent approximation scheme is defined by xn+1 = xn − αn(Ayn − z), yn = xn − βn(Axn − z), n = 0, 1, 2, . . . , where the sequences {αn} and {βn} satisfy the following conditions: (i) 0 ≤ αn, βn ≤ 1, (ii) ...
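Since the abstract's step-size conditions are truncated above, the following sketch only illustrates the shape of the two-step iteration in a finite-dimensional setting, with A taken to be a symmetric positive definite matrix (a simple Lipschitzian strongly accretive operator on R^d) and a constant illustrative step size; it is not the paper's convergence setting.

```python
import numpy as np

# Finite-dimensional sketch of the scheme described in the abstract:
#     y_n     = x_n - beta_n * (A x_n - z)
#     x_{n+1} = x_n - alpha_n * (A y_n - z)
# A is taken to be a symmetric positive definite matrix, a simple example of a
# Lipschitzian strongly accretive operator on R^d.  The abstract's exact
# conditions on {alpha_n}, {beta_n} are truncated, so the constant step size
# below (scaled by the Lipschitz constant of A) is only an illustrative choice
# satisfying 0 <= alpha_n, beta_n <= 1.

def steepest_descent_approx(A, z, x0, n_iter=500):
    L = np.linalg.norm(A, 2)                  # Lipschitz constant of x -> A x
    step = min(1.0, 1.0 / (2.0 * L))          # alpha_n = beta_n = step for all n
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iter):
        y = x - step * (A @ x - z)            # inner step
        x = x - step * (A @ y - z)            # outer steepest-descent step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T + np.eye(5)                   # SPD, hence strongly accretive on R^5
    z = rng.standard_normal(5)
    x_star = steepest_descent_approx(A, z, np.zeros(5))
    print(np.linalg.norm(A @ x_star - z))     # small residual: x_star approximates the solution of A x = z
```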
Similar works
Hybrid steepest-descent method with sequential and functional errors in Banach space
Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=Fix(T)\neq\emptyset$ and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
A Generalized Hybrid Steepest-Descent Method for Variational Inequalities in Banach Spaces
The hybrid steepest-descent method introduced by Yamada (2001) is an algorithmic solution to the variational inequality problem over the fixed point set of a nonlinear mapping, and it is applicable to a broad range of convexly constrained nonlinear inverse problems in real Hilbert spaces. Lehdili and Moudafi (1996) introduced the prox-Tikhonov regularization method for the proximal point algorithm to genera...
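To make the shape of such an iteration concrete, here is a hedged finite-dimensional sketch of a hybrid steepest-descent style scheme x_{n+1} = T(x_n) − λ_n μ F(T(x_n)), where the nonexpansive map T is assumed to be a projection onto a box and F the gradient of a strongly convex quadratic; the parameters μ and λ_n = 1/n are illustrative assumptions, not the paper's choices.

```python
import numpy as np

# Sketch of a hybrid steepest-descent style iteration
#     x_{n+1} = T(x_n) - lambda_n * mu * F(T(x_n)).
# Here T is the metric projection onto a box (a nonexpansive map whose fixed
# point set is the box itself) and F = grad f for a strongly convex quadratic f,
# so the variational inequality over Fix(T) reduces to minimizing f over the box.
# mu and lambda_n = 1/n are illustrative choices, not those of the paper.

def hybrid_steepest_descent(F, project, x0, mu=0.1, n_iter=2000):
    x = np.asarray(x0, dtype=float).copy()
    for n in range(1, n_iter + 1):
        t = project(x)                        # T(x_n): nonexpansive step
        x = t - (mu / n) * F(t)               # lambda_n = 1/n -> 0, sum lambda_n = infinity
    return project(x)

if __name__ == "__main__":
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])    # SPD => F is strongly monotone (accretive)
    b = np.array([4.0, -1.0])
    F = lambda x: Q @ x - b                   # gradient of 0.5 x'Qx - b'x
    box = lambda x: np.clip(x, 0.0, 1.0)      # projection onto [0,1]^2
    x_star = hybrid_steepest_descent(F, box, np.zeros(2))
    print(x_star)                             # close to (1, 0), the VI solution over the box
```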
Finsler Steepest Descent with Applications to Piecewise-regular Curve Evolution
This paper introduces a novel steepest descent flow in Banach spaces. This extends previous works on generalized gradient descent, notably the work of Charpiat et al. [12], to the setting of Finsler metrics. Such a generalized gradient allows one to take into account a prior on deformations (e.g., piecewise rigid) in order to favor some specific evolutions. We define a Finsler gradient descent ...
Minimization of Tikhonov Functionals in Banach Spaces
Tikhonov functionals are known to be well suited for obtaining regularized solutions of linear operator equations. We analyze two iterative methods for finding the minimizer of norm-based Tikhonov functionals in Banach spaces. One is the steepest descent method, whereby the iterations are directly carried out in the underlying space, and the other one performs iterations in the dual space. We p...
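For orientation, the sketch below shows steepest descent on a norm-based Tikhonov functional in the Hilbert-space (Euclidean) special case, where the iterations can be carried out directly in the underlying space; the duality-map machinery needed in general Banach spaces, and the paper's dual-space variant, are not reproduced here.

```python
import numpy as np

# Minimal sketch, in the Hilbert-space special case, of steepest descent on a
# Tikhonov functional  J(x) = 0.5*||K x - y||^2 + 0.5*alpha*||x||^2.
# This only illustrates the "iterate directly in the underlying space" variant
# for the Euclidean norm; the Banach-space analysis is not reproduced.

def tikhonov_steepest_descent(K, y, alpha=0.1, n_iter=300):
    x = np.zeros(K.shape[1])
    L = np.linalg.norm(K, 2) ** 2 + alpha     # Lipschitz constant of grad J
    for _ in range(n_iter):
        grad = K.T @ (K @ x - y) + alpha * x  # gradient of the Tikhonov functional
        x -= grad / L                         # fixed step 1/L, no line search
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    K = rng.standard_normal((20, 8))
    y = rng.standard_normal(20)
    x_reg = tikhonov_steepest_descent(K, y)
    # compare with the closed-form regularized solution of the same functional
    x_exact = np.linalg.solve(K.T @ K + 0.1 * np.eye(8), K.T @ y)
    print(np.linalg.norm(x_reg - x_exact))    # difference should be small
```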
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for computing an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
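The paper's double-parameter scaled quasi-Newton formula is not given in the truncated abstract, so the sketch below substitutes a simpler scaled-identity secant (Barzilai-Borwein type) step to illustrate the general idea of replacing the line search with a step length derived from secant information; it is not the paper's method.

```python
import numpy as np

# Illustration of a steepest descent method that avoids a line search by taking
# its step length from a scaled-identity secant (Barzilai-Borwein) estimate of
# the Hessian.  This is NOT the paper's double-parameter scaled quasi-Newton
# formula; it only shows how secant information can replace a line search.

def bb_steepest_descent(grad, x0, n_iter=100, step0=1e-3):
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    step = step0
    for _ in range(n_iter):
        x_new = x - step * g                  # steepest descent step, no line search
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g           # secant pair
        if abs(s @ y) > 1e-12:
            step = (s @ s) / (s @ y)          # BB1 step: inverse of a scalar Hessian estimate
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    Q = np.diag([1.0, 10.0, 100.0])           # ill-conditioned convex quadratic test problem
    b = np.array([1.0, 1.0, 1.0])
    grad = lambda x: Q @ x - b                # gradient of 0.5 x'Qx - b'x
    x_star = bb_steepest_descent(grad, np.zeros(3), n_iter=100)
    print(np.linalg.norm(Q @ x_star - b))     # residual should be small
```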